# T5 Architecture

## T5 Small Finetuned Xsum
bdwjaya · Apache-2.0 · Downloads: 103 · Likes: 0
A text summarization model based on T5-small, fine-tuned on the XSum dataset.
Tags: Text Generation, Transformers
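
The checkpoints in this list are published for the Transformers library, so they load through the same pattern: prefix the input with the model's task keyword (T5-family checkpoints are trained with prefixes such as `summarize:`), then run it through a `pipeline`. A minimal sketch, assuming a repository id of `bdwjaya/t5-small-finetuned-xsum` for the entry above (the exact id is a guess; check the model page):

```python
def t5_input(task: str, text: str) -> str:
    """Build the task-prefixed input string that T5-family checkpoints expect."""
    return f"{task}: {text.strip()}"


def summarize(text: str) -> str:
    """Load the fine-tuned checkpoint and summarize `text`.

    Downloads the model weights on first call; the repo id below is an
    assumption based on the catalog entry, not a verified identifier.
    """
    from transformers import pipeline  # pip install transformers

    summarizer = pipeline(
        "summarization",
        model="bdwjaya/t5-small-finetuned-xsum",  # hypothetical repo id
    )
    return summarizer(t5_input("summarize", text), max_length=60)[0]["summary_text"]
```

The same two steps apply to the translation, paraphrase, and QA models below; only the pipeline task name and the input prefix change.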
## Kazrush Ru Kk
deepvk · Apache-2.0 · Downloads: 332 · Likes: 8
kazRush-ru-kk is a Russian-to-Kazakh translation model based on the T5 architecture, trained on multiple open-source parallel datasets.
Tags: Machine Translation, Transformers, Other
## Rut5 Base Summ Dialogsum
Kekega · Downloads: 15 · Likes: 1
A Russian dialogue summarization model fine-tuned from d0rj/rut5-base-summ on the DialogSum dataset.
Tags: Text Generation, Transformers
## T5 Base Emojilm
KomeijiForce · Downloads: 406 · Likes: 4
A pre-trained model based on the T5 architecture that converts English sentences into emoji sequences.
Tags: Text Generation, Transformers, English
## Long T5 Base Govreport
AleBurzio · Apache-2.0 · Downloads: 866 · Likes: 2
A government report summarization model based on the Long-T5 architecture, optimized for long-document summarization.
Tags: Text Generation, Transformers, English
## Comet Atomic En
svjack · Downloads: 319 · Likes: 3
An English event reasoning model based on the T5 architecture, used to analyze event prerequisites, effects, intentions, and reactions.
Tags: Large Language Model, Transformers, English
## T5 Grammar Corruption
juancavallotti · Apache-2.0 · Downloads: 19 · Likes: 1
A grammar correction model fine-tuned from t5-base for detecting and correcting grammatical errors in text.
Tags: Machine Translation, Transformers
## Molt5 Small
laituan245 · Apache-2.0 · Downloads: 443 · Likes: 2
MOLT5-small is a pre-trained model for converting between molecular structures and natural language descriptions, in both directions.
Tags: Molecular Model, Transformers
## Molt5 Base
laituan245 · Apache-2.0 · Downloads: 3,617 · Likes: 1
molt5-base is a model based on the T5 architecture, designed for translation tasks between molecules and natural language.
Tags: Machine Translation, Transformers
## T5 Weighter Cnndm En
ThomasNLG · MIT · Downloads: 178 · Likes: 0
A classifier based on the T5-small architecture that scores the importance of answer/question pairs and determines whether an answer is relevant enough to appear in a summary.
Tags: Question Answering, Multilingual
## Ptt5 Large T5 Vocab
unicamp-dl · MIT · Downloads: 45 · Likes: 2
PTT5 is a T5 model pretrained on the BrWac corpus, optimized for Portuguese and offered in multiple sizes and vocabulary choices.
Tags: Large Language Model, Transformers, Other
## T5 Paraphrase Paws
Vamsi · Downloads: 67.42k · Likes: 38
An English sentence paraphrase generation model based on the T5 architecture, trained on the Google PAWS dataset.
Tags: Text Generation, English
## Indot5 Small
Wikidepia · Downloads: 83 · Likes: 0
A T5-small model pretrained on the Indonesian mC4 dataset; it requires fine-tuning before use.
Tags: Large Language Model, Transformers, Other
## Gec T5 Small
Unbabel · Apache-2.0 · Downloads: 4,285 · Likes: 25
A grammar error correction model based on the T5-small architecture, designed to automatically detect and correct grammatical errors in English text.
Tags: Large Language Model, Transformers, English
## T5 Qa Squad2neg En
ThomasNLG · MIT · Downloads: 533 · Likes: 0
A question answering model based on the T5-small architecture, supporting extractive QA and handling both answerable and unanswerable questions.
Tags: Question Answering, Multilingual
## T5 Paraphrase Paws Msrp Opinosis
ceshine · Apache-2.0 · Downloads: 77 · Likes: 3
A paraphrase model based on the T5-base architecture, fine-tuned on the PAWS, MSRP, and Opinosis datasets, primarily used for text paraphrasing and rewriting.
Tags: Machine Translation, English
## Kgt5 Wikikg90mv2
apoorvumang · MIT · Downloads: 22 · Likes: 1
A T5 model trained on the WikiKG90Mv2 dataset for tail entity prediction in knowledge graphs.
Tags: Knowledge Graph, Transformers
## T5 Base Cnn Dm
flax-community · Apache-2.0 · Downloads: 808 · Likes: 1
A T5 model fine-tuned for news summarization on the CNN/DailyMail dataset.
Tags: Text Generation, English
## Code Trans T5 Base Api Generation Multitask Finetune
SEBIS · Downloads: 16 · Likes: 0
A pre-trained model based on the T5 architecture, designed for Java API recommendation generation and optimized through multi-task training.
Tags: Large Language Model
## Query Gen Msmarco T5 Base V1
BeIR · Downloads: 417 · Likes: 17
A query generation model based on the T5-base architecture that generates potential search queries for text passages, enabling the training of semantic search models without labeled data.
Tags: Text Generation
## Code Trans T5 Small Api Generation Multitask Finetune
SEBIS · Downloads: 22 · Likes: 0
A Java API recommendation generation model pre-trained on the T5-small architecture, optimized through multi-task training and fine-tuning.
Tags: Large Language Model
## Ke T5 Base Ko
KETI-AIR · Apache-2.0 · Downloads: 208 · Likes: 9
KE-T5 is a Korean-English bilingual text generation model based on the T5 architecture, developed by the Korea Electronics Technology Institute, supporting cross-lingual knowledge transfer for dialogue generation tasks.
Tags: Large Language Model, Korean